
[cli] externalDocuments implemented, with typescript-operations support #10659

Merged
eddeee888 merged 1 commit into dotansimha:master from ikusakov2:feature/documentsReadOnly on Apr 12, 2026

Conversation

@ikusakov2
Contributor

Description

documentsReadOnly implemented, for safe handling of fragment includes between packages in a monorepo.

Related: [#10658](https://github.com/dotansimha/graphql-code-generator/issues/10658)

Type of change

  • New feature (non-breaking change which adds functionality)

How Has This Been Tested?

  • Unit testing

@changeset-bot

changeset-bot bot commented Mar 31, 2026

🦋 Changeset detected

Latest commit: d25dfc0

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 6 packages:

Name                                       Type
@graphql-codegen/gql-tag-operations        Minor
@graphql-codegen/visitor-plugin-common     Minor
@graphql-codegen/typescript-operations     Minor
@graphql-codegen/plugin-helpers            Minor
@graphql-codegen/cli                       Minor
@graphql-codegen/client-preset             Minor


@eddeee888 force-pushed the feature/documentsReadOnly branch from c88a311 to 3005df7 on April 3, 2026 09:14
throw error;
const readOnlyHash = JSON.stringify(readOnlyPointerMap);
const outputDocumentsReadOnly = await cache(
'documents',
Collaborator

A read-only document is also just a document for hashing purposes.
This ensures there is only one copy of each document in the final merged array, because we use the hash to detect file uniqueness.


async loadDocuments(pointer: Types.OperationDocument[]): Promise<Types.DocumentFile[]> {
async loadDocuments(
pointer: UnnormalizedTypeDefPointer,
Collaborator

The only caller of this function will be passing in UnnormalizedTypeDefPointer, and nothing else. May as well narrow the type to make it easier to code.

Comment on lines -505 to -515
function hashDocument(doc: Types.DocumentFile) {
if (doc.rawSDL) {
return hashContent(doc.rawSDL);
}
async function addHashToDocumentFiles(
documentFilesPromise: Promise<Source[]>,
type: 'standard' | 'read-only',
): Promise<Types.DocumentFile[]> {
function hashDocument(doc: Source) {
if (doc.rawSDL) {
return hashContent(doc.rawSDL);
}

if (doc.document) {
return hashContent(print(doc.document));
}
if (doc.document) {
return hashContent(print(doc.document));
}

return null;
}
Collaborator

hashDocument is only used by one function, so I moved it inline to avoid jumping around.

documentPointers: UnnormalizedTypeDefPointer | UnnormalizedTypeDefPointer[],
config: Types.Config,
): Promise<Types.DocumentFile[]> {
): Promise<Source[]> {
Copy link
Copy Markdown
Collaborator

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

When this function finishes, we'd only get the base Source. Types.DocumentFile is what we get after processing the Source (adding hash and type).

);

return newDocuments.map((document, index) => ({
...documents[index],
Collaborator
@eddeee888 Apr 3, 2026

Context for future me:

Previously we threw away hash when optimising the doc.
When we introduce type (read-only or standard), we need it for further processing, so we need to keep it like this.

Maybe we should not add this custom Codegen meta directly to the node, since the optimizeDocuments implementation may remove it in the future.

Maybe we can add a special key/value to reduce this risk e.g. __meta: { hash: 'abc', types: 'standard' } 🤔
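
A minimal sketch of that suggested grouping (the `__meta` key and the `withMeta` helper are hypothetical illustrations, not part of this PR):

```typescript
// Hypothetical sketch: grouping Codegen metadata under a single __meta key,
// so a future optimizeDocuments pass is less likely to strip it field by field.
interface DocumentMeta {
  hash: string;
  type: 'standard' | 'read-only';
}

interface DocumentFileWithMeta {
  location?: string;
  rawSDL?: string;
  __meta?: DocumentMeta;
}

// Hypothetical helper that attaches the grouped metadata to a document.
function withMeta(
  doc: Omit<DocumentFileWithMeta, '__meta'>,
  meta: DocumentMeta,
): DocumentFileWithMeta {
  return { ...doc, __meta: meta };
}

const tagged = withMeta(
  { location: 'src/user.graphql', rawSDL: 'fragment U on User { id }' },
  { hash: 'abc', type: 'standard' },
);
// tagged.__meta carries both hash and type under one key
```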

Comment on lines +37 to +38
hash: string;
type: 'standard' | 'read-only';
Collaborator
@eddeee888 Apr 3, 2026

Source is the default from parsing the docs.
DocumentFile is the "processed" source, with Codegen metadata.

We may want to group them to avoid edge cases mentioned in: https://github.com/dotansimha/graphql-code-generator/pull/10659/changes#r3032421307

@eddeee888 force-pushed the feature/documentsReadOnly branch from 8f4ba67 to df25ba7 on April 3, 2026 10:55
Collaborator

Documents now have a type of 'standard' | 'read-only'.
If we don't supply that, the plugin will generate nothing.

This utility helps by creating Types.DocumentFile with hash and type hardcoded. If we need to make the params partial with overrides in the future, we could.

@eddeee888 force-pushed the feature/documentsReadOnly branch 2 times, most recently from 40f5ca8 to 819cf71 on April 4, 2026 02:59
Comment on lines +324 to +336
const populateDocumentPointerMap = (
allDocumentsDenormalizedPointers: Types.OperationDocument[],
): UnnormalizedTypeDefPointer => {
const pointer: UnnormalizedTypeDefPointer = {};
for (const denormalizedPtr of allDocumentsDenormalizedPointers) {
if (typeof denormalizedPtr === 'string') {
pointer[denormalizedPtr] = {};
} else if (typeof denormalizedPtr === 'object') {
Object.assign(pointer, denormalizedPtr);
}
}
return pointer;
};
Collaborator

Inlining populateDocumentPointerMap makes it a bit easier to read as we don't jump around the file. We could take this pure function and put it at the bottom of the file if needed.
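
For reference, the normalization above behaves like this as a self-contained sketch (types simplified to plain objects so it runs standalone):

```typescript
// Sketch of the pointer-map normalization shown above, with simplified types.
// A pointer is either a bare path/glob string or an object map of options.
type OperationDocument = string | Record<string, unknown>;
type UnnormalizedTypeDefPointer = Record<string, unknown>;

function populateDocumentPointerMap(
  allDocumentsDenormalizedPointers: OperationDocument[],
): UnnormalizedTypeDefPointer {
  const pointer: UnnormalizedTypeDefPointer = {};
  for (const denormalizedPtr of allDocumentsDenormalizedPointers) {
    if (typeof denormalizedPtr === 'string') {
      // A bare glob/path becomes a key with empty options
      pointer[denormalizedPtr] = {};
    } else if (typeof denormalizedPtr === 'object') {
      // An object pointer is merged in as-is
      Object.assign(pointer, denormalizedPtr);
    }
  }
  return pointer;
}

// Mixed string and object pointers collapse into a single map:
const map = populateDocumentPointerMap([
  'src/**/*.graphql',
  { 'ops/query.graphql': { noRequire: true } },
]);
// → { 'src/**/*.graphql': {}, 'ops/query.graphql': { noRequire: true } }
```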

processedFile[file.hash] = true;

return prev;
}, []);
Contributor Author

Looks like the reduce accumulator prev is never used — outputDocuments.push(file) writes to an external array. The return value of the entire reduce is discarded.
It works, by chance, but this is confusing. Maybe a plain for...of loop would be cleaner?

Additionally, if a document has hash: null (which hashDocument() can return when rawSDL and document are both absent), processedFile[null] will be "null" — meaning all null-hash documents get deduped down to 1, which may be wrong.
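
The key coercion can be shown in isolation (a standalone sketch, not the PR's code):

```typescript
// JavaScript object keys are strings: a null hash is coerced to the key
// "null", so every null-hash document after the first one is skipped.
const processedFile: Record<string, true> = {};
const hashes: (string | null)[] = ['abc', null, null];
const kept: (string | null)[] = [];
for (const hash of hashes) {
  if (processedFile[hash as string]) continue; // null → key "null"
  kept.push(hash);
  processedFile[hash as string] = true;
}
// kept is ['abc', null] — the second null-hash entry was deduped away
```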

Collaborator

> Looks like the reduce accumulator prev is never used — outputDocuments.push(file) writes to an external array. The return value of the entire reduce is discarded.
> It works, by chance, but this is confusing. Maybe a plain for...of loop would be cleaner?

That's true! I think I was trying to reduce the result to outputDocuments, but it's accessible inside the loop anyways. I've made it for...of in 7ec7e06

> Additionally, if a document has hash: null (which hashDocument() can return when rawSDL and document are both absent), processedFile[null] will be "null" — meaning all null-hash documents get deduped down to 1, which may be wrong.

Huh, that's true, the typecheck in this repo is not strict enough. Something I'll need to configure after the major release.

I wonder when null would happen though 🤔 When Codegen is parsing an empty file?

Comment on lines +401 to +413
const processedFile: Record<string, true> = {};
const mergedDocuments = [
  ...outputDocumentsStandard,
  ...outputDocumentsReadOnly,
];
for (const file of mergedDocuments) {
  if (processedFile[file.hash]) {
    continue;
  }

  outputDocuments.push(file);
  processedFile[file.hash] = true;
}
Collaborator
@eddeee888 Apr 8, 2026

I've considered forwarding documentsReadOnly as-is but hit a few annoying cases:

  • documents are processed a lot after the initial loading stage here. If we kept documentsReadOnly separate, we'd need to run the same functions over them
  • Plugins use positional param, which means documentsReadOnly need to be at the last position to avoid a breaking change. This works, but feels disconnected

For this reason, I feel it's easier to merge them as early as possible and run the same processing over both types, so plugins like typescript-operations can handle the filtering without having to access another param.
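
Sketched in isolation (the field names follow the hash/type shape discussed above; `documentsToGenerate` is a hypothetical name, not the plugin's actual API):

```typescript
// Sketch: with standard and read-only documents merged into one array,
// a plugin can skip generation for read-only docs with a single filter.
interface DocumentFile {
  location?: string;
  rawSDL?: string;
  hash: string;
  type: 'standard' | 'read-only';
}

function documentsToGenerate(documents: DocumentFile[]): DocumentFile[] {
  // Read-only documents stay available for fragment resolution,
  // but no types are emitted for them.
  return documents.filter(doc => doc.type === 'standard');
}

const merged: DocumentFile[] = [
  { location: 'app/query.graphql', hash: 'h1', type: 'standard' },
  { location: 'shared/fragment.graphql', hash: 'h2', type: 'read-only' },
];
// documentsToGenerate(merged) keeps only app/query.graphql
```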

@eddeee888 changed the title from "documentsReadOnly implemented" to "[cli] externalDocuments implemented" on Apr 11, 2026
@eddeee888 changed the title from "[cli] externalDocuments implemented" to "[cli] externalDocuments implemented, with typescript-operations support" on Apr 11, 2026
@eddeee888 force-pushed the feature/documentsReadOnly branch 7 times, most recently from a2dd1f3 to f35cb96 on April 11, 2026 15:39
- `externalDocuments` is used as references during code generation. These documents are often fragments that need to be inlined into the documents' types/AST. However, the process should not generate types/AST for docs in externalDocuments.
- Add changeset
- Update generated files
@eddeee888 force-pushed the feature/documentsReadOnly branch from f35cb96 to d25dfc0 on April 11, 2026 15:47
@eddeee888 merged commit e65d303 into dotansimha:master on Apr 12, 2026
20 checks passed